Natural Language Enhancement for English Teaching Using Character-Level Recurrent Neural Network with Back Propagation Neural Network based Classification by Deep Learning Architectures
Abstract
Natural Language Processing (NLP) is an efficient method for enhancing educational outcomes. In educational settings, implementing NLP entails starting the learning process through natural acquisition. English teaching and learning have received increased attention from the relevant education departments as an integral aspect of the new curriculum reform. The teaching environment is undergoing extraordinary changes as a result of the constant improvement and extension of its level and scale, as well as the growth of the Internet and information technology. Consequently, the current research aims to investigate techniques for efficiently using AI (artificial intelligence) applications to teach and learn English from the perspective of university students, and to measure the effectiveness of such applications using deep learning techniques. Here, language enhancement is carried out with a character-level recurrent neural network with back-propagation neural network (Cha_RNN_BPNN) classification. With the help of this DL (deep learning) technique, it is possible to assist teachers in analysing and diagnosing students' behaviour, to partially replace teachers in answering questions in a timely manner, and to grade assignments automatically during the teaching process. Experimental analysis reports the word perplexity, Flesch-Kincaid (F-K) grade level readability, cosine similarity semantic coherence, gradient change of the NN, validation accuracy, and training accuracy of the proposed technique.
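The abstract names a character-level recurrent network trained with back-propagation but gives no implementation details. The following is a minimal sketch in plain NumPy of such a model trained on next-character prediction; the toy corpus, layer sizes, and learning rate are illustrative assumptions, not the authors' configuration. The accumulated cross-entropy also yields a (character-level) perplexity of the kind reported in the experiments.

```python
import numpy as np

# Toy corpus, vocabulary, and layer sizes are assumptions for illustration.
text = "natural language processing for english teaching"
chars = sorted(set(text))
char_to_ix = {c: i for i, c in enumerate(chars)}
vocab, hidden, lr = len(chars), 32, 0.1

rng = np.random.default_rng(0)
Wxh = rng.normal(0, 0.01, (hidden, vocab))   # input -> hidden
Whh = rng.normal(0, 0.01, (hidden, hidden))  # hidden -> hidden (recurrence)
Why = rng.normal(0, 0.01, (vocab, hidden))   # hidden -> output
bh, by = np.zeros((hidden, 1)), np.zeros((vocab, 1))

def loss_and_grads(inputs, targets, hprev):
    """Forward pass over a character sequence, then backpropagation through time."""
    xs, hs, ps, loss = {}, {-1: hprev}, {}, 0.0
    for t, ix in enumerate(inputs):                      # forward: one step per char
        xs[t] = np.zeros((vocab, 1)); xs[t][ix] = 1      # one-hot input
        hs[t] = np.tanh(Wxh @ xs[t] + Whh @ hs[t - 1] + bh)
        y = Why @ hs[t] + by
        ps[t] = np.exp(y - y.max()); ps[t] /= ps[t].sum()   # softmax
        loss += -np.log(ps[t][targets[t], 0])               # cross-entropy
    dWxh, dWhh, dWhy = np.zeros_like(Wxh), np.zeros_like(Whh), np.zeros_like(Why)
    dbh, dby, dhnext = np.zeros_like(bh), np.zeros_like(by), np.zeros((hidden, 1))
    for t in reversed(range(len(inputs))):               # backward through time
        dy = ps[t].copy(); dy[targets[t]] -= 1           # d loss / d logits
        dWhy += dy @ hs[t].T; dby += dy
        dhraw = (1 - hs[t] ** 2) * (Why.T @ dy + dhnext) # tanh derivative
        dbh += dhraw; dWxh += dhraw @ xs[t].T; dWhh += dhraw @ hs[t - 1].T
        dhnext = Whh.T @ dhraw
    for g in (dWxh, dWhh, dWhy, dbh, dby):
        np.clip(g, -5, 5, out=g)                         # curb exploding gradients
    return loss, (dWxh, dWhh, dWhy, dbh, dby), hs[len(inputs) - 1]

data = [char_to_ix[c] for c in text]
hprev = np.zeros((hidden, 1))
for step in range(500):                                  # plain SGD training loop
    loss, grads, hprev = loss_and_grads(data[:-1], data[1:], hprev)
    for p, g in zip((Wxh, Whh, Why, bh, by), grads):
        p -= lr * g
print("char-level perplexity:", np.exp(loss / len(data[:-1])))
```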
Similar Resources
Neural Network-based English Alphanumeric Character Recognition
We propose a neural-network-based, size- and color-invariant character recognition system using a feed-forward neural network. Our feed-forward network has two layers: an input layer and an output layer. The whole recognition process is divided into four basic steps: pre-processing, normalization, network establishment, and recognition. Pre-processing involves digitization, noise rem...
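As a sketch of the two-layer (input to output) feed-forward recogniser described above, the following assumes 28x28 glyph images and 36 alphanumeric classes with a softmax output; these sizes and choices are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

n_pixels, n_classes = 28 * 28, 36        # assumed glyph size and A-Z/0-9 classes
rng = np.random.default_rng(0)
W = rng.normal(0, 0.01, (n_classes, n_pixels))
b = np.zeros(n_classes)

def normalize(img):
    """Pre-processing/normalization step: scale pixel values into [0, 1]."""
    img = img.astype(float)
    return (img - img.min()) / (img.max() - img.min() + 1e-8)

def predict(img):
    """Recognition step: score each class and return the arg-max label."""
    x = normalize(img).ravel()           # flatten to the input layer
    logits = W @ x + b                   # single weight layer: input -> output
    p = np.exp(logits - logits.max()); p /= p.sum()   # softmax probabilities
    return p.argmax(), p
```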
Layer-wise Relevance Propagation for Deep Neural Network Architectures
We present the application of layer-wise relevance propagation to several deep neural networks, such as the BVLC reference net and GoogLeNet trained on the ImageNet and MIT Places datasets. Layer-wise relevance propagation is a method to compute scores for image pixels and image regions denoting the impact of the particular image region on the prediction of the classifier for one particular te...
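As a minimal sketch of how relevance scores are redistributed through one fully connected layer, the following implements the commonly used epsilon rule; the function name and epsilon value are illustrative assumptions, and a full pipeline would apply such a rule layer by layer from the classifier output back to the input pixels.

```python
import numpy as np

def lrp_dense(a, W, R_out, eps=1e-6):
    """Redistribute output relevance R_out onto inputs a through weights W.

    a:     activations entering the layer, shape (n_in,)
    W:     weight matrix, shape (n_out, n_in)
    R_out: relevance of the layer's outputs, shape (n_out,)
    """
    z = W @ a                                       # pre-activations z_j = sum_i w_ji a_i
    z = z + eps * np.where(z >= 0, 1.0, -1.0)       # epsilon stabiliser avoids division by zero
    s = R_out / z                                   # per-output relevance rate
    return a * (W.T @ s)                            # R_i = a_i * sum_j w_ji s_j

# Relevance is conserved up to eps: lrp_dense(a, W, R).sum() is approximately R.sum().
```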
Convolutional Neural Network Architectures for Matching Natural Language Sentences
Semantic matching is of central importance to many natural language tasks [2, 28]. A successful matching algorithm needs to adequately model the internal structures of language objects and the interaction between them. As a step toward this goal, we propose convolutional neural network models for matching two sentences, by adapting the convolutional strategy in vision and speech. The proposed m...
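To make the convolutional matching idea concrete, here is a simplified sketch in the spirit of such architectures: each sentence's embedding matrix is convolved, max-pooled into a fixed-length vector, and the two vectors are compared. The kernel shapes and the cosine scoring are illustrative assumptions, not the paper's exact model.

```python
import numpy as np

def conv1d(E, K):
    """Valid 1D convolution over a sentence.

    E: (length, dim) word embeddings; K: (width, dim, filters) kernels.
    Returns (length - width + 1, filters) feature maps after ReLU.
    """
    L, d = E.shape
    w, _, f = K.shape
    out = np.empty((L - w + 1, f))
    for t in range(L - w + 1):
        out[t] = np.einsum('wd,wdf->f', E[t:t + w], K)
    return np.maximum(out, 0)                      # ReLU

def sentence_vec(E, K):
    return conv1d(E, K).max(axis=0)                # max-pool over time

def match_score(E1, E2, K):
    """Cosine similarity between the two pooled sentence representations."""
    v1, v2 = sentence_vec(E1, K), sentence_vec(E2, K)
    return v1 @ v2 / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-8)
```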
Natural language acquisition in recurrent neural architectures
The human brain is one of the most complex dynamic systems that enables us to communicate (and externalise) information by natural language. Our languages go far beyond single sounds for expressing intentions – in fact, human children already join discourse by the age of three. It is remarkable that in these first years they show a tremendous capability in acquiring the language compete...
Deep Gate Recurrent Neural Network
This paper explores the possibility of using multiplicative gates to build two recurrent neural network structures. These two structures, called the Deep Simple Gated Unit (DSGU) and the Simple Gated Unit (SGU), are designed for learning long-term dependencies. Compared to the traditional Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU), both structures require fewer parameters and le...
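DSGU and SGU are the paper's own structures; as a point of reference for the multiplicative-gating idea they build on, here is a sketch of one step of the standard GRU cell they are compared against (parameter names and dictionary layout are illustrative assumptions).

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, P):
    """One GRU step: multiplicative gates scale the state, easing long-range credit."""
    z = sigmoid(P['Wz'] @ x + P['Uz'] @ h + P['bz'])       # update gate
    r = sigmoid(P['Wr'] @ x + P['Ur'] @ h + P['br'])       # reset gate
    h_tilde = np.tanh(P['Wh'] @ x + P['Uh'] @ (r * h) + P['bh'])  # candidate state
    return (1 - z) * h + z * h_tilde                       # gated interpolation
```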
Journal
Journal title: Journal of Universal Computer Science
Year: 2022
ISSN: 0948-695X, 0948-6968
DOI: https://doi.org/10.3897/jucs.94162